Search Results

Documents authored by Nederlof, Jesper


Document
Polynomial-Time Approximation of Independent Set Parameterized by Treewidth

Authors: Parinya Chalermsook, Fedor Fomin, Thekla Hamm, Tuukka Korhonen, Jesper Nederlof, and Ly Orgo

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
We prove the following result about approximating the maximum independent set in a graph. Informally, we show that any approximation algorithm with a "non-trivial" approximation ratio (as a function of the number of vertices of the input graph G) can be turned into an approximation algorithm achieving almost the same ratio, albeit as a function of the treewidth of G. More formally, we prove that for any function f, the existence of a polynomial time (n/f(n))-approximation algorithm yields the existence of a polynomial time O(tw⋅log f(tw)/f(tw))-approximation algorithm, where n and tw denote the number of vertices and the width of a given tree decomposition of the input graph. By pipelining our result with the state-of-the-art O(n⋅(log log n)²/log³ n)-approximation algorithm by Feige (2004), this implies an O(tw⋅(log log tw)³/log³ tw)-approximation algorithm.
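
The last sentence is the general theorem instantiated with Feige's ratio; the following short calculation (a sketch in the abstract's notation, with f read off from Feige's bound) shows where the stated exponent of log log tw comes from.

```latex
% Sketch: take f(n) = log^3 n / (log log n)^2, so that Feige's algorithm is an
% (n/f(n))-approximation. Then log f(tw) = Theta(log log tw), and the general
% theorem gives a ratio of
\[
  \mathrm{tw}\cdot\frac{\log f(\mathrm{tw})}{f(\mathrm{tw})}
  \;=\; \Theta\!\left(\mathrm{tw}\cdot\log\log \mathrm{tw}\cdot
        \frac{(\log\log \mathrm{tw})^{2}}{\log^{3}\mathrm{tw}}\right)
  \;=\; \Theta\!\left(\frac{\mathrm{tw}\,(\log\log \mathrm{tw})^{3}}{\log^{3}\mathrm{tw}}\right).
\]
```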

Cite as

Parinya Chalermsook, Fedor Fomin, Thekla Hamm, Tuukka Korhonen, Jesper Nederlof, and Ly Orgo. Polynomial-Time Approximation of Independent Set Parameterized by Treewidth. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 33:1-33:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{chalermsook_et_al:LIPIcs.ESA.2023.33,
  author =	{Chalermsook, Parinya and Fomin, Fedor and Hamm, Thekla and Korhonen, Tuukka and Nederlof, Jesper and Orgo, Ly},
  title =	{{Polynomial-Time Approximation of Independent Set Parameterized by Treewidth}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{33:1--33:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.33},
  URN =		{urn:nbn:de:0030-drops-186865},
  doi =		{10.4230/LIPIcs.ESA.2023.33},
  annote =	{Keywords: Maximum Independent Set, Treewidth, Approximation Algorithms, Parameterized Approximation}
}
Document
A Fine-Grained Classification of the Complexity of Evaluating the Tutte Polynomial on Integer Points Parameterized by Treewidth and Cutwidth

Authors: Isja Mannens and Jesper Nederlof

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
We give a fine-grained classification of evaluating the Tutte polynomial T(G;x,y) on all integer points on graphs with small treewidth and cutwidth. Specifically, we show for any point (x,y) ∈ ℤ² that either
- T(G; x, y) can be computed in polynomial time,
- T(G; x, y) can be computed in 2^O(tw) n^O(1) time, but not in 2^o(ctw) n^O(1) time assuming the Exponential Time Hypothesis (ETH), or
- T(G; x, y) can be computed in 2^O(tw log tw) n^O(1) time, but not in 2^o(ctw log ctw) n^O(1) time assuming the ETH,
where we assume tree decompositions of treewidth tw and cutwidth decompositions of cutwidth ctw are given as input along with the input graph on n vertices and the point (x,y). To obtain these results, we refine the existing reductions that were instrumental for the seminal dichotomy by Jaeger, Welsh and Vertigan [Math. Proc. Cambridge Philos. Soc.'90]. One of our technical contributions is a new rank bound of a matrix that indicates whether the union of two forests is a forest itself, which we use to show that the number of forests of a graph can be counted in 2^O(tw) n^O(1) time.

Cite as

Isja Mannens and Jesper Nederlof. A Fine-Grained Classification of the Complexity of Evaluating the Tutte Polynomial on Integer Points Parameterized by Treewidth and Cutwidth. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 82:1-82:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{mannens_et_al:LIPIcs.ESA.2023.82,
  author =	{Mannens, Isja and Nederlof, Jesper},
  title =	{{A Fine-Grained Classification of the Complexity of Evaluating the Tutte Polynomial on Integer Points Parameterized by Treewidth and Cutwidth}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{82:1--82:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.82},
  URN =		{urn:nbn:de:0030-drops-187354},
  doi =		{10.4230/LIPIcs.ESA.2023.82},
  annote =	{Keywords: Width Parameters, Parameterized Complexity, Tutte Polynomial}
}
Document
Tight Lower Bounds for Problems Parameterized by Rank-Width

Authors: Benjamin Bergougnoux, Tuukka Korhonen, and Jesper Nederlof

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
We show that there is no 2^o(k²) n^O(1) time algorithm for Independent Set on n-vertex graphs with rank-width k, unless the Exponential Time Hypothesis (ETH) fails. Our lower bound matches the 2^O(k²) n^O(1) time algorithm given by Bui-Xuan, Telle, and Vatshelle [Discret. Appl. Math., 2010] and it answers the open question of Bergougnoux and Kanté [SIAM J. Discret. Math., 2021]. We also show that the known 2^O(k²) n^O(1) time algorithms for Weighted Dominating Set, Maximum Induced Matching and Feedback Vertex Set parameterized by rank-width k are optimal assuming ETH. Our results are the first tight ETH lower bounds parameterized by rank-width that do not follow directly from lower bounds for n-vertex graphs.

Cite as

Benjamin Bergougnoux, Tuukka Korhonen, and Jesper Nederlof. Tight Lower Bounds for Problems Parameterized by Rank-Width. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 11:1-11:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bergougnoux_et_al:LIPIcs.STACS.2023.11,
  author =	{Bergougnoux, Benjamin and Korhonen, Tuukka and Nederlof, Jesper},
  title =	{{Tight Lower Bounds for Problems Parameterized by Rank-Width}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{11:1--11:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.11},
  URN =		{urn:nbn:de:0030-drops-176636},
  doi =		{10.4230/LIPIcs.STACS.2023.11},
  annote =	{Keywords: rank-width, exponential time hypothesis, Boolean-width, parameterized algorithms, independent set, dominating set, maximum induced matching, feedback vertex set}
}
Document
Complete Volume
LIPIcs, Volume 249, IPEC 2022, Complete Volume

Authors: Holger Dell and Jesper Nederlof

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
LIPIcs, Volume 249, IPEC 2022, Complete Volume

Cite as

17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 1-520, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@Proceedings{dell_et_al:LIPIcs.IPEC.2022,
  title =	{{LIPIcs, Volume 249, IPEC 2022, Complete Volume}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{1--520},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022},
  URN =		{urn:nbn:de:0030-drops-173553},
  doi =		{10.4230/LIPIcs.IPEC.2022},
  annote =	{Keywords: LIPIcs, Volume 249, IPEC 2022, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Holger Dell and Jesper Nederlof

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 0:i-0:xviii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{dell_et_al:LIPIcs.IPEC.2022.0,
  author =	{Dell, Holger and Nederlof, Jesper},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{0:i--0:xviii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.0},
  URN =		{urn:nbn:de:0030-drops-173562},
  doi =		{10.4230/LIPIcs.IPEC.2022.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Tight Bounds for Counting Colorings and Connected Edge Sets Parameterized by Cutwidth

Authors: Carla Groenland, Isja Mannens, Jesper Nederlof, and Krisztina Szilágyi

Published in: LIPIcs, Volume 219, 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022)


Abstract
We study the fine-grained complexity of counting the number of colorings and connected spanning edge sets parameterized by the cutwidth and treewidth of the graph. While decompositions of small treewidth decompose the graph with small vertex separators, decompositions with small cutwidth decompose the graph with small edge separators. Let p,q ∈ ℕ such that p is a prime and q ≥ 3. We show:
- If p divides q-1, there is a (q-1)^{ctw} n^{O(1)} time algorithm for counting list q-colorings modulo p of n-vertex graphs of cutwidth ctw. Furthermore, there is no ε > 0 for which there is a (q-1-ε)^{ctw} n^{O(1)} time algorithm that counts the number of list q-colorings modulo p of n-vertex graphs of cutwidth ctw, assuming the Strong Exponential Time Hypothesis (SETH).
- If p does not divide q-1, there is no ε > 0 for which there exists a (q-ε)^{ctw} n^{O(1)} time algorithm that counts the number of list q-colorings modulo p of n-vertex graphs of cutwidth ctw, assuming SETH.
The lower bounds are in stark contrast with the existing 2^{ctw} n^{O(1)} time algorithm to compute the chromatic number of a graph by Jansen and Nederlof [Theor. Comput. Sci.'18]. Furthermore, by building upon the above lower bounds, we obtain the following lower bound for counting connected spanning edge sets: there is no ε > 0 for which there is an algorithm that, given a graph G and a cutwidth ordering of cutwidth ctw, counts the number of spanning connected edge sets of G modulo p in time (p-ε)^{ctw} n^{O(1)}, assuming SETH. We also give an algorithm with matching running time for this problem. Before our work, even for the treewidth parameterization, the best conditional lower bound by Dell et al. [ACM Trans. Algorithms'14] only excluded 2^{o(tw)} n^{O(1)} time algorithms for this problem. Both our algorithms and lower bounds employ the matrix rank method, relating the complexity of the problem to the rank of a certain "compatibility matrix" in a non-trivial way.

Cite as

Carla Groenland, Isja Mannens, Jesper Nederlof, and Krisztina Szilágyi. Tight Bounds for Counting Colorings and Connected Edge Sets Parameterized by Cutwidth. In 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 219, pp. 36:1-36:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{groenland_et_al:LIPIcs.STACS.2022.36,
  author =	{Groenland, Carla and Mannens, Isja and Nederlof, Jesper and Szil\'{a}gyi, Krisztina},
  title =	{{Tight Bounds for Counting Colorings and Connected Edge Sets Parameterized by Cutwidth}},
  booktitle =	{39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022)},
  pages =	{36:1--36:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-222-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{219},
  editor =	{Berenbrink, Petra and Monmege, Benjamin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2022.36},
  URN =		{urn:nbn:de:0030-drops-158464},
  doi =		{10.4230/LIPIcs.STACS.2022.36},
  annote =	{Keywords: connected edge sets, cutwidth, parameterized algorithms, colorings, counting modulo p}
}
Document
Isolation Schemes for Problems on Decomposable Graphs

Authors: Jesper Nederlof, Michał Pilipczuk, Céline M. F. Swennenhuis, and Karol Węgrzycki

Published in: LIPIcs, Volume 219, 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022)


Abstract
The Isolation Lemma of Mulmuley, Vazirani and Vazirani [Combinatorica'87] provides a self-reduction scheme that allows one to assume that a given instance of a problem has a unique solution, provided a solution exists at all. Since its introduction, much effort has been dedicated towards derandomization of the Isolation Lemma for specific classes of problems. So far, the focus was mainly on problems solvable in polynomial time. In this paper, we study a setting that is more typical for NP-complete problems, and obtain partial derandomizations in the form of significantly decreasing the number of required random bits. In particular, motivated by the advances in parameterized algorithms, we focus on problems on decomposable graphs. For example, for the problem of detecting a Hamiltonian cycle, we build upon the rank-based approach from [Bodlaender et al., Inf. Comput.'15] and design isolation schemes that use
- 𝒪(t log n + log² n) random bits on graphs of treewidth at most t;
- 𝒪(√n) random bits on planar or H-minor free graphs; and
- 𝒪(n) random bits on general graphs.
In all these schemes, the weights are bounded exponentially in the number of random bits used. As a corollary, for every fixed H we obtain an algorithm for detecting a Hamiltonian cycle in an H-minor-free graph that runs in deterministic time 2^{𝒪(√n)} and uses polynomial space; this is the first algorithm to achieve such complexity guarantees. For problems of a more local nature, such as finding an independent set of maximum size, we obtain isolation schemes on graphs of treedepth at most d that use 𝒪(d) random bits and assign polynomially bounded weights. We also complement our findings with several unconditional and conditional lower bounds, which show that many of the results cannot be significantly improved.
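
As background, here is a minimal sketch (not the paper's isolation schemes) that empirically checks the guarantee of the Isolation Lemma on a toy family of sets: weights drawn uniformly from {1, ..., 2|U|} make the minimum-weight member unique with probability at least 1/2. The function name, family, and parameters are invented purely for illustration.

```python
import random
from itertools import combinations

def isolation_probability(family, universe_size, trials=2000):
    """Empirical check of the Mulmuley-Vazirani-Vazirani Isolation Lemma:
    drawing each element's weight independently and uniformly from
    {1, ..., 2*|U|} makes the minimum-weight set in `family` unique
    with probability at least 1/2."""
    unique = 0
    for _ in range(trials):
        w = [random.randint(1, 2 * universe_size) for _ in range(universe_size)]
        weights = sorted(sum(w[e] for e in s) for s in family)
        if len(weights) == 1 or weights[0] < weights[1]:
            unique += 1
    return unique / trials

# Toy solution family: all 4-element subsets of a 10-element universe.
family = [frozenset(c) for c in combinations(range(10), 4)]
print(isolation_probability(family, universe_size=10))  # typically well above 0.5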

Cite as

Jesper Nederlof, Michał Pilipczuk, Céline M. F. Swennenhuis, and Karol Węgrzycki. Isolation Schemes for Problems on Decomposable Graphs. In 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 219, pp. 50:1-50:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{nederlof_et_al:LIPIcs.STACS.2022.50,
  author =	{Nederlof, Jesper and Pilipczuk, Micha{\l} and Swennenhuis, C\'{e}line M. F. and W\k{e}grzycki, Karol},
  title =	{{Isolation Schemes for Problems on Decomposable Graphs}},
  booktitle =	{39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022)},
  pages =	{50:1--50:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-222-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{219},
  editor =	{Berenbrink, Petra and Monmege, Benjamin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2022.50},
  URN =		{urn:nbn:de:0030-drops-158601},
  doi =		{10.4230/LIPIcs.STACS.2022.50},
  annote =	{Keywords: Isolation Lemma, Derandomization, Hamiltonian Cycle, Exact Algorithms}
}
Document
On the Fine-Grained Parameterized Complexity of Partial Scheduling to Minimize the Makespan

Authors: Jesper Nederlof and Céline M. F. Swennenhuis

Published in: LIPIcs, Volume 180, 15th International Symposium on Parameterized and Exact Computation (IPEC 2020)


Abstract
We study a natural variant of scheduling that we call partial scheduling: In this variant an instance of a scheduling problem along with an integer k is given and one seeks an optimal schedule where not all, but only k, jobs have to be processed. Specifically, we aim to determine the fine-grained parameterized complexity of partial scheduling problems parameterized by k for all variants of scheduling problems that minimize the makespan and involve unit/arbitrary processing times, identical/unrelated parallel machines, release/due dates, and precedence constraints. That is, we investigate whether algorithms with runtimes of the type f(k)n^𝒪(1) or n^𝒪(f(k)) exist for a function f that is as small as possible. Our contribution is two-fold: First, we categorize each variant to be either in 𝖯, NP-complete and fixed-parameter tractable by k, or 𝖶[1]-hard parameterized by k. Second, for many interesting cases we further investigate the run time on a finer scale and obtain run times that are (almost) optimal assuming the Exponential Time Hypothesis. As one of our main technical contributions, we give an 𝒪(8^k k(|V|+|E|)) time algorithm to solve instances of partial scheduling problems minimizing the makespan with unit length jobs, precedence constraints and release dates, where G = (V,E) is the graph with precedence constraints.

Cite as

Jesper Nederlof and Céline M. F. Swennenhuis. On the Fine-Grained Parameterized Complexity of Partial Scheduling to Minimize the Makespan. In 15th International Symposium on Parameterized and Exact Computation (IPEC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 180, pp. 25:1-25:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{nederlof_et_al:LIPIcs.IPEC.2020.25,
  author =	{Nederlof, Jesper and Swennenhuis, C\'{e}line M. F.},
  title =	{{On the Fine-Grained Parameterized Complexity of Partial Scheduling to Minimize the Makespan}},
  booktitle =	{15th International Symposium on Parameterized and Exact Computation (IPEC 2020)},
  pages =	{25:1--25:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-172-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{180},
  editor =	{Cao, Yixin and Pilipczuk, Marcin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2020.25},
  URN =		{urn:nbn:de:0030-drops-133287},
  doi =		{10.4230/LIPIcs.IPEC.2020.25},
  annote =	{Keywords: Fixed-Parameter Tractability, Scheduling, Precedence Constraints}
}
Document
Equal-Subset-Sum Faster Than the Meet-in-the-Middle

Authors: Marcin Mucha, Jesper Nederlof, Jakub Pawlewicz, and Karol Węgrzycki

Published in: LIPIcs, Volume 144, 27th Annual European Symposium on Algorithms (ESA 2019)


Abstract
In the Equal-Subset-Sum problem, we are given a set S of n integers and the problem is to decide if there exist two disjoint nonempty subsets A, B ⊆ S whose elements sum up to the same value. The problem is NP-complete. The state-of-the-art algorithm runs in O^*(3^{n/2}) ≤ O^*(1.7321^n) time and is based on the meet-in-the-middle technique. In this paper, we improve upon this algorithm and give an O^*(1.7088^n) worst-case Monte Carlo algorithm. This answers a question suggested by Woeginger in his inspirational survey. Additionally, we analyse the polynomial space algorithm for Equal-Subset-Sum. A naive polynomial space algorithm for Equal-Subset-Sum runs in O^*(3^n) time. With read-only access to the exponentially many random bits, we show a randomized algorithm running in O^*(2.6817^n) time and polynomial space.
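
For readers unfamiliar with the O^*(3^{n/2}) baseline mentioned above, here is a minimal meet-in-the-middle sketch (not the paper's improved algorithm): every element is labelled +1 (goes to A), -1 (goes to B) or 0 (unused), and the two halves of the input are enumerated separately. The function name and example numbers are mine, for illustration only.

```python
from itertools import product

def equal_subset_sum_mitm(nums):
    """Baseline meet-in-the-middle for Equal-Subset-Sum: enumerate the 3^(n/2)
    sign labellings of each half and look for a combined labelling with signed
    sum 0 in which both A (label +1) and B (label -1) are nonempty.
    Returns (A, B) as index lists, or None."""
    n = len(nums)
    left, right = list(range(n // 2)), list(range(n // 2, n))

    def signed_sums(idx):
        table = {}
        for signs in product((1, -1, 0), repeat=len(idx)):
            s = sum(sg * nums[i] for sg, i in zip(signs, idx))
            table.setdefault(s, []).append(signs)
        return table

    left_table = signed_sums(left)
    for signs_r in product((1, -1, 0), repeat=len(right)):
        s_r = sum(sg * nums[i] for sg, i in zip(signs_r, right))
        for signs_l in left_table.get(-s_r, []):
            labels = list(zip(signs_l, left)) + list(zip(signs_r, right))
            A = [i for sg, i in labels if sg == 1]
            B = [i for sg, i in labels if sg == -1]
            if A and B:
                return A, B
    return None

# ([3, 4], [0]): nums[3] + nums[4] = 11 + 36 = 47 = nums[0]
print(equal_subset_sum_mitm([47, 18, 29, 11, 36, 7]))
```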

Cite as

Marcin Mucha, Jesper Nederlof, Jakub Pawlewicz, and Karol Węgrzycki. Equal-Subset-Sum Faster Than the Meet-in-the-Middle. In 27th Annual European Symposium on Algorithms (ESA 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 144, pp. 73:1-73:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{mucha_et_al:LIPIcs.ESA.2019.73,
  author =	{Mucha, Marcin and Nederlof, Jesper and Pawlewicz, Jakub and W\k{e}grzycki, Karol},
  title =	{{Equal-Subset-Sum Faster Than the Meet-in-the-Middle}},
  booktitle =	{27th Annual European Symposium on Algorithms (ESA 2019)},
  pages =	{73:1--73:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-124-5},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{144},
  editor =	{Bender, Michael A. and Svensson, Ola and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2019.73},
  URN =		{urn:nbn:de:0030-drops-111946},
  doi =		{10.4230/LIPIcs.ESA.2019.73},
  annote =	{Keywords: Equal-Subset-Sum, Subset-Sum, meet-in-the-middle, enumeration technique, randomized algorithm}
}
Document
Computing the Chromatic Number Using Graph Decompositions via Matrix Rank

Authors: Bart M. P. Jansen and Jesper Nederlof

Published in: LIPIcs, Volume 112, 26th Annual European Symposium on Algorithms (ESA 2018)


Abstract
Computing the smallest number q such that the vertices of a given graph can be properly q-colored is one of the oldest and most fundamental problems in combinatorial optimization. The q-Coloring problem has been studied intensively using the framework of parameterized algorithmics, resulting in a very good understanding of the best-possible algorithms for several parameterizations based on the structure of the graph. For example, algorithms are known to solve the problem on graphs of treewidth tw in time O^*(q^{tw}), while a running time of O^*((q-ε)^{tw}) is impossible assuming the Strong Exponential Time Hypothesis (SETH). While there is an abundance of work for parameterizations based on decompositions of the graph by vertex separators, almost nothing is known about parameterizations based on edge separators. We fill this gap by studying q-Coloring parameterized by cutwidth, and parameterized by pathwidth in bounded-degree graphs. Our research uncovers interesting new ways to exploit small edge separators. We present two algorithms for q-Coloring parameterized by cutwidth ctw: a deterministic one that runs in time O^*(2^{ω⋅ctw}), where ω is the matrix multiplication constant, and a randomized one with runtime O^*(2^{ctw}). In sharp contrast to earlier work, the running time is independent of q. The dependence on cutwidth is optimal: we prove that even 3-Coloring cannot be solved in O^*((2-ε)^{ctw}) time assuming SETH. Our algorithms rely on a new rank bound for a matrix that describes compatible colorings. Combined with a simple communication protocol for evaluating a product of two polynomials, this also yields an O^*((⌊d/2⌋+1)^{pw}) time randomized algorithm for q-Coloring on graphs of pathwidth pw and maximum degree d. Such a runtime was first obtained by Björklund, but only for graphs with few proper colorings. We also prove that this result is optimal in the sense that no O^*((⌊d/2⌋+1-ε)^{pw})-time algorithm exists assuming SETH.

Cite as

Bart M. P. Jansen and Jesper Nederlof. Computing the Chromatic Number Using Graph Decompositions via Matrix Rank. In 26th Annual European Symposium on Algorithms (ESA 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 112, pp. 47:1-47:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.ESA.2018.47,
  author =	{Jansen, Bart M. P. and Nederlof, Jesper},
  title =	{{Computing the Chromatic Number Using Graph Decompositions via Matrix Rank}},
  booktitle =	{26th Annual European Symposium on Algorithms (ESA 2018)},
  pages =	{47:1--47:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-081-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{112},
  editor =	{Azar, Yossi and Bast, Hannah and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2018.47},
  URN =		{urn:nbn:de:0030-drops-95104},
  doi =		{10.4230/LIPIcs.ESA.2018.47},
  annote =	{Keywords: Parameterized Complexity, Chromatic Number, Graph Decompositions}
}
Document
Subexponential Time Algorithms for Embedding H-Minor Free Graphs

Authors: Hans L. Bodlaender, Jesper Nederlof, and Tom C. van der Zanden

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
We establish the complexity of several graph embedding problems: Subgraph Isomorphism, Graph Minor, Induced Subgraph and Induced Minor, when restricted to H-minor free graphs. In each of these problems, we are given a pattern graph P and a host graph G, and want to determine whether P is a subgraph (minor, induced subgraph or induced minor) of G. We show that, for any fixed graph H and ε > 0, if P is H-minor free and G has treewidth tw, (induced) subgraph can be solved in 2^{O(k^ε⋅tw + k/log k)}⋅n^{O(1)} time and (induced) minor can be solved in 2^{O(k^ε⋅tw + tw⋅log tw + k/log k)}⋅n^{O(1)} time, where k = |V(P)|. We also show that this is optimal, in the sense that the existence of an algorithm for one of these problems running in 2^{o(n/log n)} time would contradict the Exponential Time Hypothesis. This solves an open problem on the complexity of Subgraph Isomorphism for planar graphs. The key algorithmic insight is that dynamic programming approaches can be sped up by identifying isomorphic connected components in the pattern graph. This technique seems widely applicable, and it appears that there is a relatively unexplored class of problems that share a similar upper and lower bound.

Cite as

Hans L. Bodlaender, Jesper Nederlof, and Tom C. van der Zanden. Subexponential Time Algorithms for Embedding H-Minor Free Graphs. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 9:1-9:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{bodlaender_et_al:LIPIcs.ICALP.2016.9,
  author =	{Bodlaender, Hans L. and Nederlof, Jesper and van der Zanden, Tom C.},
  title =	{{Subexponential Time Algorithms for Embedding H-Minor Free Graphs}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{9:1--9:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.9},
  URN =		{urn:nbn:de:0030-drops-62756},
  doi =		{10.4230/LIPIcs.ICALP.2016.9},
  annote =	{Keywords: subgraph isomorphism, graph minors, subexponential time}
}
Document
Exponential Time Paradigms Through the Polynomial Time Lens

Authors: Andrew Drucker, Jesper Nederlof, and Rahul Santhanam

Published in: LIPIcs, Volume 57, 24th Annual European Symposium on Algorithms (ESA 2016)


Abstract
We propose a general approach to modelling algorithmic paradigms for the exact solution of NP-hard problems. Our approach is based on polynomial time reductions to succinct versions of problems solvable in polynomial time. We use this viewpoint to explore and compare the power of paradigms such as branching and dynamic programming, and to shed light on the true complexity of various problems. As one instantiation, we model branching using the notion of witness compression, i.e., reducibility to the circuit satisfiability problem parameterized by the number of variables of the circuit. We show this is equivalent to the previously studied notion of 'OPP-algorithms', and provide a technique for proving conditional lower bounds for witness compressions via a constructive variant of AND-composition, which is a notion previously studied in the theory of preprocessing. In the context of parameterized complexity we use this to show that problems such as Pathwidth and Treewidth and Independent Set parameterized by pathwidth do not have witness compression, assuming NP ⊈ coNP/poly. Since these problems admit fast fixed parameter tractable algorithms via dynamic programming, this shows that dynamic programming can be stronger than branching, under a standard complexity hypothesis. Our approach has applications outside parameterized complexity as well: for example, we show that if a polynomial time algorithm outputs a maximum independent set of a given planar graph on n vertices with probability exp(-n^{1-ε}) for some ε > 0, then NP ⊆ coNP/poly. This negative result dims the prospects for one very natural approach to sub-exponential time algorithms for problems on planar graphs. As two other (more exploratory) illustrations of our approach, we model algorithms based on inclusion-exclusion or group algebras via the notion of "parity compression", and we model a subclass of dynamic programming algorithms with the notion of "disjunctive dynamic programming". These models give us a way to naturally classify various parameterized problems with FPT algorithms. In the case of the dynamic programming model, we show that Independent Set parameterized by pathwidth is complete for this model.

Cite as

Andrew Drucker, Jesper Nederlof, and Rahul Santhanam. Exponential Time Paradigms Through the Polynomial Time Lens. In 24th Annual European Symposium on Algorithms (ESA 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 57, pp. 36:1-36:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{drucker_et_al:LIPIcs.ESA.2016.36,
  author =	{Drucker, Andrew and Nederlof, Jesper and Santhanam, Rahul},
  title =	{{Exponential Time Paradigms Through the Polynomial Time Lens}},
  booktitle =	{24th Annual European Symposium on Algorithms (ESA 2016)},
  pages =	{36:1--36:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-015-6},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{57},
  editor =	{Sankowski, Piotr and Zaroliagis, Christos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2016.36},
  URN =		{urn:nbn:de:0030-drops-63871},
  doi =		{10.4230/LIPIcs.ESA.2016.36},
  annote =	{Keywords: exponential time paradigms, branching, dynamic programming, lower bounds}
}
Document
Finding Large Set Covers Faster via the Representation Method

Authors: Jesper Nederlof

Published in: LIPIcs, Volume 57, 24th Annual European Symposium on Algorithms (ESA 2016)


Abstract
The worst-case fastest known algorithm for the Set Cover problem on universes with n elements still essentially is the simple O^*(2^n)-time dynamic programming algorithm, and no non-trivial consequences of an O^*(1.01^n)-time algorithm are known. Motivated by this chasm, we study the following natural question: which instances of Set Cover can we solve faster than the simple dynamic programming algorithm? Specifically, we give a Monte Carlo algorithm that determines the existence of a set cover of size σ⋅n in O^*(2^{(1-Ω(σ^4))n}) time. Our approach is also applicable to Set Cover instances with exponentially many sets: by reducing the task of finding the chromatic number χ(G) of a given n-vertex graph G to Set Cover in the natural way, we show there is an O^*(2^{(1-Ω(σ^4))n})-time randomized algorithm that, given an integer s = σ⋅n, outputs NO if χ(G) > s and YES with constant probability if χ(G) ≤ s-1. On a high level, our results are inspired by the "representation method" of Howgrave-Graham and Joux [EUROCRYPT'10] and obtained by only evaluating a randomly sampled subset of the table entries of a dynamic programming algorithm.
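
For reference, the simple O^*(2^n)-time dynamic program that the abstract takes as its baseline can be sketched as follows; the function name and the toy instance are mine, for illustration only.

```python
def min_set_cover(n, sets):
    """The simple O*(2^n)-time dynamic program over subsets of an n-element
    universe: dp[mask] = fewest sets whose union covers the universe-subset
    `mask`.  `sets` is a list of bitmasks over the universe {0, ..., n-1}."""
    INF = float("inf")
    dp = [INF] * (1 << n)
    dp[0] = 0
    for mask in range(1 << n):
        if dp[mask] == INF:
            continue
        for s in sets:
            if dp[mask] + 1 < dp[mask | s]:
                dp[mask | s] = dp[mask] + 1
    return dp[(1 << n) - 1]  # INF if the universe cannot be covered

# Universe {0,1,2,3}; the optimum picks the first two sets.
print(min_set_cover(4, [0b0011, 0b1100, 0b0110]))  # -> 2
```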

Cite as

Jesper Nederlof. Finding Large Set Covers Faster via the Representation Method. In 24th Annual European Symposium on Algorithms (ESA 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 57, pp. 69:1-69:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{nederlof:LIPIcs.ESA.2016.69,
  author =	{Nederlof, Jesper},
  title =	{{Finding Large Set Covers Faster via the Representation Method}},
  booktitle =	{24th Annual European Symposium on Algorithms (ESA 2016)},
  pages =	{69:1--69:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-015-6},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{57},
  editor =	{Sankowski, Piotr and Zaroliagis, Christos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2016.69},
  URN =		{urn:nbn:de:0030-drops-64114},
  doi =		{10.4230/LIPIcs.ESA.2016.69},
  annote =	{Keywords: Set Cover, Exact Exponential Algorithms, Fine-Grained Complexity}
}
Document
Dense Subset Sum May Be the Hardest

Authors: Per Austrin, Petteri Kaski, Mikko Koivisto, and Jesper Nederlof

Published in: LIPIcs, Volume 47, 33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016)


Abstract
The Subset Sum problem asks whether a given set of n positive integers contains a subset of elements that sum up to a given target t. It is an outstanding open question whether the O^*(2^{n/2})-time algorithm for Subset Sum by Horowitz and Sahni [J. ACM 1974] can be beaten in the worst-case setting by a "truly faster", O^*(2^{(0.5-δ)n})-time algorithm, with some constant δ > 0. Continuing an earlier work [STACS 2015], we study Subset Sum parameterized by the maximum bin size β, defined as the largest number of subsets of the n input integers that yield the same sum. For every ε > 0 we give a truly faster algorithm for instances with β ≤ 2^{(0.5-ε)n}, as well as instances with β ≥ 2^{0.661n}. Consequently, we also obtain a characterization in terms of the popular density parameter n/log₂ t: if all instances of density at least 1.003 admit a truly faster algorithm, then so does every instance. This goes against the current intuition that instances of density 1 are the hardest, and therefore is a step toward answering the open question in the affirmative. Our results stem from a novel combinatorial analysis of mixings of earlier algorithms for Subset Sum and a study of an extremal question in additive combinatorics connected to the problem of Uniquely Decodable Code Pairs in information theory.
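
The Horowitz-Sahni O^*(2^{n/2}) benchmark mentioned above is the classical meet-in-the-middle algorithm; a decision-only sketch follows (the function name and example instance are mine, for illustration only).

```python
def subset_sum_mitm(nums, target):
    """Classical O*(2^(n/2)) meet-in-the-middle for Subset Sum: enumerate all
    subset sums of each half of the input and check whether some pair of
    half-sums adds up to the target.  Decision version only."""
    half = len(nums) // 2

    def all_subset_sums(arr):
        sums = {0}
        for x in arr:
            sums |= {s + x for s in sums}
        return sums

    right_sums = all_subset_sums(nums[half:])
    return any(target - s in right_sums for s in all_subset_sums(nums[:half]))

print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 9))   # True  (4 + 5)
print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 30))  # False
```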

Cite as

Per Austrin, Petteri Kaski, Mikko Koivisto, and Jesper Nederlof. Dense Subset Sum May Be the Hardest. In 33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 47, pp. 13:1-13:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{austrin_et_al:LIPIcs.STACS.2016.13,
  author =	{Austrin, Per and Kaski, Petteri and Koivisto, Mikko and Nederlof, Jesper},
  title =	{{Dense Subset Sum May Be the Hardest}},
  booktitle =	{33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016)},
  pages =	{13:1--13:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-001-9},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{47},
  editor =	{Ollinger, Nicolas and Vollmer, Heribert},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2016.13},
  URN =		{urn:nbn:de:0030-drops-57143},
  doi =		{10.4230/LIPIcs.STACS.2016.13},
  annote =	{Keywords: subset sum, additive combinatorics, exponential-time algorithm, homomorphic hashing, Littlewood--Offord problem}
}
Document
Subset Sum in the Absence of Concentration

Authors: Per Austrin, Petteri Kaski, Mikko Koivisto, and Jesper Nederlof

Published in: LIPIcs, Volume 30, 32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015)


Abstract
We study the exact time complexity of the Subset Sum problem. Our focus is on instances that lack additive structure in the sense that the sums one can form from the subsets of the given integers are not strongly concentrated on any particular integer value. We present a randomized algorithm that runs in O(2^{0.3399n} B^4) time on instances with the property that no value can arise as a sum of more than B different subsets of the n given integers.
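
To make the parameter B concrete, here is a brute-force computation of it on toy inputs (purely illustrative; the function name and examples are mine).

```python
from itertools import combinations

def max_bin_size(nums):
    """The parameter B used above: the largest number of subsets of the input
    integers that share the same sum.  Brute force over all 2^n subsets."""
    counts = {}
    for r in range(len(nums) + 1):
        for c in combinations(range(len(nums)), r):
            s = sum(nums[i] for i in c)
            counts[s] = counts.get(s, 0) + 1
    return max(counts.values())

print(max_bin_size([1, 2, 4, 8]))  # 1: all 16 subset sums are distinct
print(max_bin_size([1, 2, 3]))     # 2: {3} and {1,2} share the sum 3
```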

Cite as

Per Austrin, Petteri Kaski, Mikko Koivisto, and Jesper Nederlof. Subset Sum in the Absence of Concentration. In 32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 30, pp. 48-61, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{austrin_et_al:LIPIcs.STACS.2015.48,
  author =	{Austrin, Per and Kaski, Petteri and Koivisto, Mikko and Nederlof, Jesper},
  title =	{{Subset Sum in the Absence of Concentration}},
  booktitle =	{32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015)},
  pages =	{48--61},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-78-1},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{30},
  editor =	{Mayr, Ernst W. and Ollinger, Nicolas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2015.48},
  URN =		{urn:nbn:de:0030-drops-49034},
  doi =		{10.4230/LIPIcs.STACS.2015.48},
  annote =	{Keywords: subset sum, additive combinatorics, exponential-time algorithm, homomorphic hashing, Littlewood--Offord problem}
}